A MEMORYLESS SYMMETRIC RANK-ONE METHOD WITH SUFFICIENT DESCENT PROPERTY FOR UNCONSTRAINED OPTIMIZATION

Authors
Abstract


Similar articles

Memoryless Modified Symmetric Rank-One Method for Large-Scale Unconstrained Optimization

Problem statement: Memoryless QN methods, which can be viewed as one-step limited-memory QN methods, have been regarded as effective techniques for solving large-scale problems. In this study, we present a scaled memoryless modified Symmetric Rank-One (SR1) algorithm and investigate the numerical performance of the proposed algorithm for solving large-scale unconstrained optimization problems. Ap...

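As a rough illustration of the memoryless idea summarized above, the sketch below computes a quasi-Newton search direction by applying the SR1 inverse update once to a scaled identity matrix; the scaling gamma = s^T y / y^T y, the safeguard thresholds, and all names are illustrative assumptions, not the algorithm proposed in that paper.

import numpy as np

def memoryless_sr1_direction(g, s, y, eps=1e-8):
    """Search direction from a (scaled) memoryless SR1 update.

    The inverse-Hessian approximation is rebuilt at every iteration from the
    scaled identity gamma*I, so no matrix is ever stored; only the vectors
    g (current gradient), s = x_k - x_{k-1} and y = grad_k - grad_{k-1}
    are needed.
    """
    sy = float(s @ y)
    if sy <= eps:                           # curvature too small: fall back
        return -g                           # to steepest descent
    gamma = sy / float(y @ y)               # assumed scaling of the identity
    u = s - gamma * y                       # SR1 correction vector
    uy = float(u @ y)
    if abs(uy) <= eps * np.linalg.norm(u) * np.linalg.norm(y):
        return -gamma * g                   # skip an unstable SR1 correction
    # H = gamma*I + u u^T / (u^T y),  d = -H g, evaluated matrix-free
    return -(gamma * g + (float(u @ g) / uy) * u)

Each call costs only a handful of inner products, which is what makes the memoryless approach attractive for large-scale problems.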

Scaled memoryless symmetric rank one method for large-scale optimization

This paper concerns the memoryless quasi-Newton method, that is, precisely the quasi-Newton method in which the approximation to the inverse of the Hessian is updated, at each step, from the identity matrix. Hence its search direction can be computed without storing any matrices. In this paper, a scaled memoryless symmetric rank one (SR1) method for solving large-scale unconstrained optimization...

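For reference, when the inverse-Hessian approximation is restarted from the (unscaled) identity at every step, the memoryless SR1 update and the resulting matrix-free search direction take the closed form below; this is the generic formula behind the description above, written without the scaling that the paper introduces.

\[
H_{k+1} = I + \frac{(s_k - y_k)(s_k - y_k)^{\mathsf T}}{(s_k - y_k)^{\mathsf T} y_k},
\qquad
d_{k+1} = -H_{k+1} g_{k+1}
        = -g_{k+1} - \frac{(s_k - y_k)^{\mathsf T} g_{k+1}}{(s_k - y_k)^{\mathsf T} y_k}\,(s_k - y_k),
\]

where \(s_k = x_{k+1} - x_k\) and \(y_k = g_{k+1} - g_k\); only inner products of n-vectors are required, so the storage is O(n).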

Structured symmetric rank-one method for unconstrained optimization

In this paper, we investigate a symmetric rank-one (SR1) quasi-Newton (QN) formula in which the Hessian of the objective function has some special structure. Instead of approximating the whole Hessian via the SR1 formula, we consider an approach that approximates only the part of the Hessian matrix that is not easily acquired. Although the SR1 update possesses desirable features, it is unstable in...

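For context, the classical SR1 update of a direct Hessian approximation \(B_k\) is given below; in a structured variant of the kind described above, only the part of the Hessian that is hard to obtain would be approximated this way, while the readily available part is kept exact (that splitting is our reading of the abstract, not a formula taken from the paper).

\[
B_{k+1} = B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\mathsf T}}{(y_k - B_k s_k)^{\mathsf T} s_k},
\qquad s_k = x_{k+1} - x_k,\quad y_k = g_{k+1} - g_k .
\]

The denominator may vanish or become arbitrarily small, which is the source of the instability the abstract alludes to; practical codes therefore skip the update whenever \(|(y_k - B_k s_k)^{\mathsf T} s_k|\) is small relative to \(\|y_k - B_k s_k\|\,\|s_k\|\).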

A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization

Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods which always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. Moreover, we pres...

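As a generic illustration only (the coefficients of the family proposed in that paper are not reproduced here), a three-term conjugate gradient direction and the sufficient descent property it is required to satisfy can be written as

\[
d_k = -g_k + \beta_k d_{k-1} + \theta_k y_{k-1},
\qquad
g_k^{\mathsf T} d_k \le -c\,\|g_k\|^2 \quad (c > 0,\ \text{for all } k),
\]

with \(y_{k-1} = g_k - g_{k-1}\); the point of such methods is to choose \(\beta_k\) and \(\theta_k\) so that the inequality holds independently of the line search.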

A New Sufficient Descent Conjugate Gradient Method for Unconstrained Optimization

In this paper, a new conjugate gradient method with the sufficient descent property is proposed for the unconstrained optimization problem. An attractive property of the new method is that the descent direction it generates always possesses the sufficient descent property, and this property is independent of the line search used and of the choice of the parameter. Under mild conditions, the global c...

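The sufficient descent inequality mentioned in the two abstracts above is also easy to enforce numerically; the hypothetical helper below simply tests it for a candidate direction and restarts with steepest descent when it fails, a common safeguard rather than the mechanism used in either paper.

import numpy as np

def ensure_sufficient_descent(g, d, c=1e-4):
    """Return a direction satisfying g^T d <= -c * ||g||^2 (sufficient descent)."""
    if float(g @ d) <= -c * float(g @ g):
        return d        # the candidate direction already qualifies
    return -g           # safeguard: steepest descent satisfies it for c <= 1

# example: a non-descent candidate gets replaced by -g
g = np.array([1.0, 2.0])
d = np.array([1.0, 0.0])                  # g^T d = 1 > 0, not a descent direction
print(ensure_sufficient_descent(g, d))    # [-1. -2.]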


Journal

Journal title: Journal of the Operations Research Society of Japan

Year: 2018

ISSN: 0453-4514, 2188-8299

DOI: 10.15807/jorsj.61.53